Error analysis of regularized least-square regression with Fredholm kernel
Abstract
Learning with the Fredholm kernel has recently attracted increasing attention, since it can effectively exploit the data information to improve prediction performance. Despite rapid progress on theoretical and experimental evaluations, its generalization analysis has not been explored in the learning theory literature. In this paper, we establish the generalization bound of least-square regularized regression with the Fredholm kernel, which implies that the fast learning rate O(l^{-1}) can be reached under mild capacity conditions. Simulated examples show that this Fredholm regression algorithm achieves satisfactory prediction performance.
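The paper itself gives no code, but the setup is easy to sketch. The Python snippet below is a minimal illustration, assuming the common data-dependent Fredholm kernel construction K_F(x, z) = (1/u^2) \sum_{i,j} k(x, u_i) k_H(u_i, u_j) k(u_j, z) plugged into standard regularized least squares; the Gaussian kernel choices, the bandwidths, and the function names are illustrative assumptions, not the authors' implementation, and the training sample here doubles as the discretization set U.

import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Z.
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def fredholm_kernel(X, Z, U, sigma_outer=1.0, sigma_inner=1.0):
    # Data-dependent Fredholm kernel (assumed construction):
    #   K_F(x, z) = (1/u^2) * sum_{i,j} k(x, u_i) k_H(u_i, u_j) k(u_j, z),
    # where the points in U (labeled and/or unlabeled) discretize the
    # Fredholm integral operator.
    K_xu = gaussian_kernel(X, U, sigma_outer)   # k(x, u_i)
    K_uu = gaussian_kernel(U, U, sigma_inner)   # inner kernel k_H(u_i, u_j)
    K_uz = gaussian_kernel(U, Z, sigma_outer)   # k(u_j, z)
    return K_xu @ K_uu @ K_uz / (len(U) ** 2)

def fit(X, y, lam=1e-3, **kernel_params):
    # Regularized least squares with the Fredholm kernel:
    #   min_f (1/l) * sum_i (f(x_i) - y_i)^2 + lam * ||f||^2,
    # solved in closed form via the representer theorem.
    l = len(X)
    K = fredholm_kernel(X, X, X, **kernel_params)
    return np.linalg.solve(K + lam * l * np.eye(l), y)

def predict(X_train, alpha, X_new, **kernel_params):
    return fredholm_kernel(X_new, X_train, X_train, **kernel_params) @ alpha

# Toy usage: recover a noisy sinc target.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(80, 1))
y = np.sinc(X[:, 0]) + 0.1 * rng.standard_normal(80)
alpha = fit(X, y, lam=1e-3, sigma_outer=0.8, sigma_inner=0.8)
X_new = np.linspace(-3.0, 3.0, 200)[:, None]
y_hat = predict(X, alpha, X_new, sigma_outer=0.8, sigma_inner=0.8)

The closed-form solve (K + \lambda l I)\alpha = y follows from the representer theorem; in practice U would typically also contain unlabeled points, which is where the Fredholm construction draws the extra data information the abstract mentions.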
Similar papers
Learning Rates of Least-Square Regularized Regression
This paper considers the regularized learning algorithm associated with the least-square loss and reproducing kernel Hilbert spaces. The target is the error analysis for the regression problem in learning theory. A novel regularization approach is presented, which yields satisfactory learning rates. The rates depend on the approximation property and the capacity of the reproducing kernel Hilbert...
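For concreteness, the scheme referred to here is standard regularized least-square (kernel ridge) regression over an RKHS; in the usual learning-theory notation (sample \mathbf{z} = \{(x_i, y_i)\}_{i=1}^{l}, kernel K, regularization parameter \lambda), and stated as the textbook formulation rather than this paper's exact setup, it reads

f_{\mathbf{z},\lambda} = \arg\min_{f \in \mathcal{H}_K} \frac{1}{l} \sum_{i=1}^{l} \big( f(x_i) - y_i \big)^2 + \lambda \| f \|_K^2,
\qquad
f_{\mathbf{z},\lambda}(x) = \sum_{i=1}^{l} \alpha_i K(x, x_i), \quad \boldsymbol{\alpha} = (\mathbf{K} + \lambda l \mathbf{I})^{-1} \mathbf{y},

with \mathbf{K} = (K(x_i, x_j))_{i,j=1}^{l}; the learning rates in question measure how fast the excess risk of f_{\mathbf{z},\lambda} decays with l.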
Reproducing Kernel Banach Spaces with the ℓ1 Norm II: Error Analysis for Regularized Least Square Regression
A typical approach to estimating the learning rate of a regularized learning scheme is to bound the approximation error by the sum of the sampling error, the hypothesis error and the regularization error. Using a reproducing kernel space that satisfies the linear representer theorem brings the advantage of discarding the hypothesis error from the sum automatically. Following this direction, we ...
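Schematically, the decomposition this snippet refers to bounds the excess risk by three terms (the symbols below are the generic ones from this literature, not this paper's exact notation):

\mathcal{E}(f_{\mathbf{z}}) - \mathcal{E}(f_\rho) \;\le\; \underbrace{\mathcal{S}(\mathbf{z},\lambda)}_{\text{sampling error}} + \underbrace{\mathcal{P}(\mathbf{z},\lambda)}_{\text{hypothesis error}} + \underbrace{\mathcal{D}(\lambda)}_{\text{regularization error}},

and when the hypothesis space satisfies the linear representer theorem the hypothesis-error term \mathcal{P} vanishes, which is exactly the advantage described above.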
Regularized Least Square Regression with Spherical Polynomial Kernels
This article considers regularized least square regression on the sphere. It develops a theoretical analysis of the generalization performance of the regularized least square regression algorithm with spherical polynomial kernels. Explicit bounds are derived for the excess risk. The learning rates depend on the eigenvalues of spherical polynomial integral operators and on the dimension o...
Convergence Rate of Coefficient Regularized Kernel-based Learning Algorithms
We investigate machine learning for least square regression with data-dependent hypothesis spaces and coefficient regularization algorithms based on general kernels. We provide some estimates for the learning rates of both regression and classification when the hypothesis spaces are sample dependent. Under a weak condition on the kernels we derive the learning error by estimating the rate of some K-f...
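Coefficient regularization replaces the RKHS-norm penalty with a penalty on the expansion coefficients themselves, which is what makes the hypothesis space sample dependent and lets k be a general (not necessarily symmetric or positive semi-definite) kernel. A standard ℓ2 instance, up to normalization conventions that vary across papers, is

f_{\mathbf{z}}(x) = \sum_{j=1}^{l} \alpha_j^{\mathbf{z}} \, k(x, x_j), \qquad \boldsymbol{\alpha}^{\mathbf{z}} = \arg\min_{\boldsymbol{\alpha} \in \mathbb{R}^l} \frac{1}{l} \sum_{i=1}^{l} \Big( \sum_{j=1}^{l} \alpha_j k(x_i, x_j) - y_i \Big)^2 + \lambda \sum_{j=1}^{l} \alpha_j^2 .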
Application of integral operator for regularized least-square regression
In this paper, we study the consistency of regularized least-square regression in general reproducing kernel Hilbert spaces. We characterize the compactness of the inclusion map from a reproducing kernel Hilbert space into the space of continuous functions and show that the capacity-based analysis by uniform covering numbers may fail in a very general setting. We prove the consistency an...
Journal: Neurocomputing
Volume: 249
Publication year: 2017